
Add MiniMax as third AI provider#57

Open
octo-patch wants to merge 1 commit into viperrcrypto:main from octo-patch:feature/add-minimax-provider

Conversation

@octo-patch

Summary

Adds MiniMax as a third AI provider alongside Anthropic and OpenAI. MiniMax exposes an OpenAI-compatible API with models such as M2.7 (1M-token context window), M2.5, and M2.5-highspeed (204K context, fastest).

Changes

  • lib/minimax-auth.ts — Auth resolution with priority: override key → DB key → MINIMAX_API_KEY env var → proxy via MINIMAX_BASE_URL
  • lib/ai-client.ts — MiniMaxAIClient class using the OpenAI SDK, with thinking-tag stripping for M2.5+ model responses
  • lib/settings.ts — MiniMax model cache + getProvider() now returns 'minimax' as a valid provider
  • app/api/settings/route.ts — MiniMax model validation (MiniMax-M2.7, MiniMax-M2.5, MiniMax-M2.5-highspeed), API key CRUD
  • app/api/settings/test/route.ts — MiniMax connection test endpoint
  • app/settings/page.tsx — 3-way provider toggle (Anthropic / OpenAI / MiniMax) with model selector and key management
  • .env.example — MINIMAX_API_KEY and MINIMAX_BASE_URL documentation
  • README.md — MiniMax in tech stack table and configuration docs
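The auth resolution order described above (override key → DB key → env var → proxy) can be sketched roughly as below. This is an illustrative sketch, not the actual lib/minimax-auth.ts API: the names MiniMaxAuthSources, resolveMiniMaxAuth, and the MiniMaxAuth shape are assumptions.

```typescript
// Hypothetical sketch of the auth resolution chain; names are illustrative
// and do not reflect the real lib/minimax-auth.ts exports.
interface MiniMaxAuthSources {
  overrideKey?: string;  // per-request override key
  dbKey?: string;        // key stored in the settings DB
  envKey?: string;       // MINIMAX_API_KEY environment variable
  proxyBaseUrl?: string; // MINIMAX_BASE_URL (keyless proxy fallback)
}

type MiniMaxAuth =
  | { kind: "key"; apiKey: string }
  | { kind: "proxy"; baseUrl: string };

function resolveMiniMaxAuth(src: MiniMaxAuthSources): MiniMaxAuth {
  // Priority: override key > DB key > env var > proxy base URL.
  const key = src.overrideKey ?? src.dbKey ?? src.envKey;
  if (key) return { kind: "key", apiKey: key };
  if (src.proxyBaseUrl) return { kind: "proxy", baseUrl: src.proxyBaseUrl };
  throw new Error("No MiniMax credentials configured");
}
```

The nullish-coalescing chain makes the priority order explicit and easy to test: each source only applies when every higher-priority source is absent.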

Why MiniMax?

  • 1M context window (M2.7) — ideal for processing large batches of bookmarks in fewer API calls
  • OpenAI-compatible API — minimal integration effort, reuses the existing OpenAI SDK
  • Competitive pricing — good alternative for users who want to try different providers

Test plan

  • 30 tests added (12 auth unit, 14 client unit, 4 integration) — all passing
  • TypeScript compiles cleanly (npx tsc --noEmit)
  • Manual: select MiniMax provider in Settings, add API key, test connection
  • Manual: run categorization pipeline with MiniMax provider selected
  • Manual: verify thinking tags are stripped from M2.5 responses
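The thinking-tag stripping checked in the last manual step could look something like the sketch below. The `<think>…</think>` tag format is an assumption about how M2.5+ models emit reasoning; the function name is illustrative, not the real client code.

```typescript
// Hypothetical sketch of thinking-tag stripping for M2.5+ responses.
// Assumes reasoning is wrapped in <think>…</think> blocks, which is an
// assumption about the model output format, not confirmed by this PR.
function stripThinkingTags(text: string): string {
  // Non-greedy, multi-line match so multiple blocks are each removed.
  return text.replace(/<think>[\s\S]*?<\/think>/g, "").trim();
}
```

Using `[\s\S]*?` instead of `.*?` matters here: `.` does not match newlines by default, and thinking blocks typically span several lines.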

13 files changed, 799 additions(+), 22 deletions(-)

MiniMax offers OpenAI-compatible API with models like M2.7 (1M context),
M2.5, and M2.5-highspeed. This adds full provider support including:

- MiniMaxAIClient using the OpenAI SDK, with thinking-tag stripping for M2.5+ responses
- Auth resolution (override key > DB key > MINIMAX_API_KEY env var > proxy)
- Settings API: model validation, key storage, connection test endpoint
- Settings UI: 3-way provider toggle with MiniMax model selector
- Environment variables: MINIMAX_API_KEY, MINIMAX_BASE_URL
- 30 tests (12 unit for auth, 14 unit for client, 4 integration)